Surprisal - definition. What is surprisal?

What is surprisal - definition

A logarithmic quantity derived from the probability of a particular event. Also known as: self-information, Shannon information, Shannon information content, self-entropy.

Surprisal analysis
Surprisal analysis is an information-theoretic technique that integrates and applies principles of thermodynamics and maximal entropy. It relates the underlying microscopic properties of a system to its macroscopic bulk properties. A reduced sketch of its central quantity follows.
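
The full method determines the reference distribution by maximizing entropy subject to constraints; the following is a very reduced, hypothetical sketch of the central quantity only, assuming a uniform (unconstrained maximal-entropy) reference. The function name and the uniform default are illustrative assumptions, not standard tooling for the method.

    import math

    def surprisal_analysis(observed, reference=None):
        """Per-state surprisal -ln(P_i / P0_i) of an observed distribution
        relative to a maximal-entropy reference (uniform if none is given).
        Assumes all observed counts are positive."""
        total = sum(observed)
        probs = [x / total for x in observed]
        if reference is None:
            # With no constraints, the maximal-entropy reference is uniform.
            reference = [1.0 / len(observed)] * len(observed)
        return [-math.log(p / p0) for p, p0 in zip(probs, reference)]

    # Over-represented states get negative surprisal, depleted states positive.
    print(surprisal_analysis([4, 3, 2, 1]))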
Surprisal
noun. The act of surprising, or the state of being surprised; surprise.
Information content         
In information theory, the information content, self-information, surprisal, or Shannon information is a basic quantity derived from the probability of a particular event occurring from a random variable. It can be thought of as an alternative way of expressing probability, much like odds or log-odds, but which has particular mathematical advantages in the setting of information theory.
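
As a quick illustration of this definition (a minimal sketch in Python; the function name is ours, not a library's): the surprisal of an event with probability p is -log(p), so a certain event carries no information and rarer events carry more.

    import math

    def surprisal(p: float, base: float = 2.0) -> float:
        """Self-information of an event with probability p; base 2 gives bits (shannons)."""
        if not 0.0 < p <= 1.0:
            raise ValueError("probability must lie in (0, 1]")
        return -math.log(p, base)

    print(surprisal(0.5))   # 1.0 bit: as surprising as a fair coin flip
    print(surprisal(1.0))   # 0.0 bits: a certain event carries no information

Halving the probability adds exactly one bit, which is the log-odds-like behaviour the definition alludes to.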

Wikipedia

Information content


The Shannon information can be interpreted as quantifying the level of "surprise" of a particular outcome. As it is such a basic quantity, it also appears in several other settings, such as the length of a message needed to transmit the event given an optimal source coding of the random variable.
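
The source-coding connection can be made concrete with a standard Huffman construction (a hedged sketch, not taken from the article): for a dyadic distribution, the optimal codeword lengths coincide exactly with the surprisals.

    import heapq, itertools, math

    def huffman_code_lengths(probs):
        """Optimal (Huffman) codeword lengths, in bits, for a list of probabilities."""
        counter = itertools.count()  # tie-breaker so the heap never compares lists
        heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        while len(heap) > 1:
            p1, _, ids1 = heapq.heappop(heap)
            p2, _, ids2 = heapq.heappop(heap)
            for i in ids1 + ids2:
                lengths[i] += 1  # each merge adds one bit to every member's codeword
            heapq.heappush(heap, (p1 + p2, next(counter), ids1 + ids2))
        return lengths

    probs = [0.5, 0.25, 0.125, 0.125]
    print(huffman_code_lengths(probs))     # [1, 2, 3, 3]
    print([-math.log2(p) for p in probs])  # [1.0, 2.0, 3.0, 3.0] -- the surprisals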

The Shannon information is closely related to entropy, which is the expected value of the self-information of a random variable, quantifying how surprising the random variable is "on average". This is the average amount of self-information an observer would expect to gain about a random variable when measuring it.
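
In code, this relationship is just a probability-weighted average of surprisals (same sketch conventions as above):

    import math

    def entropy(probs, base: float = 2.0) -> float:
        """Shannon entropy: the expected surprisal of a distribution."""
        return sum(-p * math.log(p, base) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally surprising on average
    print(entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable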

The information content can be expressed in various units of information, of which the most common is the "bit" (more correctly called the shannon).
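
The unit is set only by the base of the logarithm; for the same event:

    import math

    p = 0.5
    print(-math.log2(p))   # 1.0    shannons (bits), base-2 logarithm
    print(-math.log(p))    # ~0.693 nats, natural logarithm
    print(-math.log10(p))  # ~0.301 hartleys, base-10 logarithm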